On Lower Bounds for Statistical Learning Theory
Authors
Abstract

In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are “optimal” in the sense that no other algorithm can achieve smaller statistical error. This paper...

Similar Resources
Lower Bounds on Learning Random Structures with Statistical Queries
We show that random DNF formulas, random log-depth decision trees and random deterministic finite acceptors cannot be weakly learned with a polynomial number of statistical queries with respect to an arbitrary distribution.
Statistical Learning Theory, Instructors: R. Castro and A. Singh, Lecture 17: Minimax Lower Bounds (Lower Performance Bounds)
* An observation model, P_f, indexed by f ∈ F. P_f denotes the distribution of the data under model f; e.g., in regression and classification, this is the distribution of Z = (X_1, Y_1, ..., X_n, Y_n) ∈ Z. We will assume that P_f is a probability measure on the measurable space (Z, B).
* A performance metric d(·, ·) ≥ 0. If you have a model estimate f̂_n, then the performance of that model estimate...
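The excerpt above sets up the standard minimax framework. In that notation, the quantity such lower bounds control is the minimax risk — a standard definition from minimax theory, not stated explicitly in the excerpt:

```latex
% Minimax risk over the model class F: the best worst-case expected
% performance achievable by any estimator \hat{f}_n built from the data Z ~ P_f.
R_n^* \;=\; \inf_{\hat{f}_n} \, \sup_{f \in \mathcal{F}} \;
  \mathbb{E}_{P_f}\!\left[ d\big(\hat{f}_n, f\big) \right]
```

A lower bound on R_n^* certifies that no estimator, however computed, can achieve smaller worst-case error over F at a faster rate.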
Journal

Journal title: Entropy
Year: 2017
ISSN: 1099-4300
DOI: 10.3390/e19110617